The quantum relative entropy as a rate function and information criteria

Authors

Abstract


Similar articles

Relative entropy in quantum information theory

We review the properties of the quantum relative entropy function and discuss its application to problems of classical and quantum information transfer and to quantum data compression. We then outline further uses of relative entropy to quantify quantum entanglement and analyze its manipulation. 1 Quantum relative entropy In this paper we discuss several uses of the quantum relative entropy fun...
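The quantum relative entropy discussed in this review is defined as S(ρ‖σ) = Tr[ρ(log ρ − log σ)] for density matrices ρ and σ. A minimal numerical sketch of that definition, assuming Hermitian positive-semidefinite inputs and using an eigendecomposition for the matrix logarithm (the two example states below are illustrative, not from the paper):

```python
import numpy as np

def quantum_relative_entropy(rho, sigma):
    """S(rho || sigma) = Tr[rho (log rho - log sigma)], natural log."""
    def logm_herm(a):
        # Matrix log of a Hermitian PSD matrix via eigendecomposition.
        w, v = np.linalg.eigh(a)
        w = np.clip(w, 1e-12, None)  # guard against tiny negative eigenvalues
        return v @ np.diag(np.log(w)) @ v.conj().T
    return float(np.real(np.trace(rho @ (logm_herm(rho) - logm_herm(sigma)))))

# Two single-qubit density matrices (illustrative values).
rho = np.array([[0.75, 0.0], [0.0, 0.25]])
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])
```

For commuting states like these, the quantity reduces to the classical relative entropy of the eigenvalue distributions; S(ρ‖σ) = 0 exactly when ρ = σ.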


Rate Distortion Function for a Class of Relative Entropy Sources

This paper deals with rate distortion or source coding with fidelity criterion, in measure spaces, for a class of source distributions. The class of source distributions is described by a relative entropy constraint set between the true and a nominal distribution. The rate distortion problem for the class is thus formulated and solved using minimax strategies, which result in robust source codi...


The role of relative entropy in quantum information theory

Quantum mechanics and information theory are among the most important scientific discoveries of the last century. Although these two areas initially developed separately, it has emerged that they are in fact intimately related. In this review the author shows how quantum information theory extends traditional information theory by exploring the limits imposed by quantum, rather than classical, ...


Relative Entropy Rate between a Markov Chain and Its Corresponding Hidden Markov Chain

In this paper we study the relative entropy rate between a homogeneous Markov chain and a hidden Markov chain defined by observing the output of a discrete stochastic channel whose input is the finite-state-space homogeneous stationary Markov chain. For this purpose, we obtain the relative entropy between two finite subsequences of the above-mentioned chains with the help of the definition of...
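The paper treats the harder Markov-versus-hidden-Markov case; a simpler, standard instance of the same quantity is the relative entropy rate between two homogeneous Markov chains with transition matrices P and Q, which equals Σᵢ πᵢ Σⱼ Pᵢⱼ log(Pᵢⱼ/Qᵢⱼ) with π the stationary distribution of P. A sketch under the assumptions that P is ergodic and Qᵢⱼ > 0 wherever Pᵢⱼ > 0 (the example matrices are illustrative):

```python
import numpy as np

def stationary_dist(P):
    """Stationary distribution of an ergodic transition matrix P (rows sum to 1)."""
    w, v = np.linalg.eig(P.T)
    i = np.argmin(np.abs(w - 1.0))     # left eigenvector for eigenvalue 1
    pi = np.real(v[:, i])
    return pi / pi.sum()

def relative_entropy_rate(P, Q):
    """D-rate(P || Q) = sum_i pi_i sum_j P_ij * log(P_ij / Q_ij)."""
    pi = stationary_dist(P)
    rate = 0.0
    for i in range(P.shape[0]):
        for j in range(P.shape[1]):
            if P[i, j] > 0:
                rate += pi[i] * P[i, j] * np.log(P[i, j] / Q[i, j])
    return rate

# Two-state example chains (illustrative values).
P = np.array([[0.9, 0.1], [0.2, 0.8]])
Q = np.array([[0.8, 0.2], [0.3, 0.7]])
```

The rate is zero exactly when the two chains share the same transition law on the states visited by P.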


Relative entropy as a measure of diagnostic information.

Relative entropy is a concept within information theory that provides a measure of the distance between two probability distributions. The author proposes that the amount of information gained by performing a diagnostic test can be quantified by calculating the relative entropy between the posttest and pretest probability distributions. This statistic, in essence, quantifies the degree to which...
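The proposal in this abstract reduces to a classical Kullback–Leibler divergence between the posttest and pretest probability distributions. A minimal sketch, assuming a binary disease/no-disease outcome and hypothetical pretest and posttest probabilities (the numbers below are invented for illustration, not taken from the paper):

```python
import numpy as np

def relative_entropy(p, q):
    """KL divergence D(p || q) in bits; terms with p_i = 0 contribute zero."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# Hypothetical example: a test raises the disease probability from 10% to 50%.
pretest = [0.10, 0.90]    # [disease, no disease] before the test
posttest = [0.50, 0.50]   # after observing the test result

info_gained = relative_entropy(posttest, pretest)
```

Here the diagnostic information gained is D(posttest ‖ pretest), measured in bits; an uninformative test (posttest = pretest) yields zero.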



Journal

Journal title: Quantum Information Processing

Year: 2013

ISSN: 1570-0755, 1573-1332

DOI: 10.1007/s11128-013-0540-x